
    Automatic supervision of temperature, humidity, and luminance with an Assistant Personal Robot

    Smart environments and Ambient Intelligence (AmI) technologies are defining a future society in which energy optimization and intelligent management are essential for sustainable progress. Mobile robotics is also making an important contribution to this progress through the integration of sensors and intelligent processing algorithms. This paper presents the application of an Assistant Personal Robot (APR) as an autonomous agent for temperature, humidity, and luminance supervision in human-frequented areas. The robot's multi-agent capabilities allow it to gather sensor information while exploring or performing specific tasks and then to verify human comfort levels. The proposed methodology creates information maps with the distribution of temperature, humidity, and luminance, interprets this information in terms of comfort, and warns of corrective actions if required.
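    To illustrate the kind of comfort check the abstract describes, the following minimal sketch evaluates a map of temperature, humidity, and luminance samples against comfort ranges and flags cells that would call for corrective action. It is not the authors' implementation; the threshold values, function names, and map representation are assumptions for illustration only.

```python
# Hypothetical comfort check over a map of sensor samples gathered by the robot.
# The threshold ranges below are illustrative assumptions, not values from the paper.

COMFORT_RANGES = {
    "temperature_c": (20.0, 26.0),    # assumed comfortable indoor temperature range
    "humidity_pct": (30.0, 60.0),     # assumed comfortable relative humidity range
    "luminance_lux": (300.0, 750.0),  # assumed comfortable illuminance range
}

def assess_cell(sample):
    """Return the quantities that fall outside their comfort range in one map cell."""
    issues = []
    for quantity, (low, high) in COMFORT_RANGES.items():
        value = sample[quantity]
        if value < low or value > high:
            issues.append((quantity, value))
    return issues

def comfort_map(samples):
    """samples maps (x, y) grid cells to their sensor readings.
    Returns only the cells that would trigger a corrective-action warning."""
    return {cell: issues
            for cell, issues in ((cell, assess_cell(s)) for cell, s in samples.items())
            if issues}

if __name__ == "__main__":
    samples = {
        (0, 0): {"temperature_c": 22.5, "humidity_pct": 45.0, "luminance_lux": 500.0},
        (0, 1): {"temperature_c": 28.1, "humidity_pct": 65.0, "luminance_lux": 120.0},
    }
    print(comfort_map(samples))  # only cell (0, 1) is flagged
```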

    Experimental characterization of the twin-eye laser mouse sensor

    This paper proposes the experimental characterization of a laser mouse sensor used in some optical mouse devices. The sensor characterized, called the twin-eye laser mouse sensor, uses the Doppler effect to measure displacement as an alternative to optical flow-based mouse sensors. The experimental characterization showed measurement performance similar to that of optical flow sensors, except in the sensitivity to height changes and in the measurement of nonlinear displacements, where the twin-eye sensor performed better. The measurement principle of this optical sensor can be applied to the development of alternative inexpensive applications that require planar displacement measurement and low sensitivity to Z-axis changes, such as mobile robotics. The authors acknowledge the support of the Government of Catalonia (Comissionat per a Universitats i Recerca, Departament d’Innovació, Universitats i Empresa) and the European Social Fund.
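    As a rough illustration of the Doppler-based measurement principle mentioned above, the sketch below converts a measured Doppler frequency shift into an in-plane displacement using the standard laser Doppler relation. The wavelength, beam angle, and single-axis treatment are assumptions drawn from laser Doppler velocimetry in general, not specifics of the twin-eye sensor or of the paper.

```python
import math

# Hypothetical single-axis Doppler-to-displacement conversion.
# WAVELENGTH_M and BEAM_ANGLE_RAD are assumed values, not taken from the paper.
WAVELENGTH_M = 850e-9                 # assumed laser wavelength
BEAM_ANGLE_RAD = math.radians(30.0)   # assumed angle between beam and surface plane

def velocity_from_doppler(doppler_hz):
    """In-plane surface velocity from the Doppler shift of one beam:
    f_D = 2 * v * cos(theta) / lambda  ->  v = f_D * lambda / (2 * cos(theta))."""
    return doppler_hz * WAVELENGTH_M / (2.0 * math.cos(BEAM_ANGLE_RAD))

def displacement(doppler_samples_hz, sample_period_s):
    """Integrate per-sample Doppler shifts into a displacement along one axis."""
    return sum(velocity_from_doppler(f) * sample_period_s for f in doppler_samples_hz)

if __name__ == "__main__":
    # A 1 kHz Doppler shift sustained over 100 samples of 1 ms each.
    print(displacement([1000.0] * 100, 1e-3))  # displacement in metres along that axis
```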

    An embedded real-time red peach detection system based on an OV7670 camera, ARM Cortex-M4 processor and 3D Look-Up Tables

    This work proposes the development of an embedded real-time fruit detection system for future automatic fruit harvesting. The proposed embedded system is based on an ARM Cortex-M4 (STM32F407VGT6) processor and an Omnivision OV7670 color camera. The future goal of this embedded vision system will be to control a robotized arm to automatically select and pick some fruit directly from the tree. The complete embedded system has been designed to be placed directly in the gripper tool of the future robotized harvesting arm. The embedded system will be able to perform real-time fruit detection and tracking by using a three-dimensional look-up table (LUT) defined in the RGB color space and optimized for fruit picking. Additionally, two different methodologies for creating optimized 3D LUTs, based on existing linear color models and on fruit histograms, were implemented in this work and compared for the case of red peaches. The resulting system is able to acquire general and zoomed orchard images and to update the relative tracking information of a red peach in the tree ten times per second.
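    The following minimal sketch shows the general idea of classifying pixels with a 3D RGB look-up table: each color channel is quantized into a bin index and the class is read back with a single table lookup per pixel. The LUT resolution and the naive "red dominates" rule used to fill it here are illustrative assumptions; the paper builds its LUTs from linear color models and red-peach histograms, and the real system runs in C on the Cortex-M4 rather than in Python.

```python
import numpy as np

BINS = 32          # assumed LUT resolution per RGB channel (32 x 32 x 32 entries)
STEP = 256 // BINS  # quantization step from 8-bit channel value to bin index

def build_demo_lut():
    """Fill a BINS^3 boolean LUT with a naive 'red dominates' rule, standing in
    for the paper's color-model / histogram-based LUT construction."""
    lut = np.zeros((BINS, BINS, BINS), dtype=bool)
    r, g, b = np.meshgrid(np.arange(BINS), np.arange(BINS), np.arange(BINS),
                          indexing="ij")
    lut[(r > g + 4) & (r > b + 4)] = True
    return lut

def classify(image_rgb, lut):
    """Return a boolean mask of 'fruit' pixels via one LUT lookup per pixel."""
    idx = (image_rgb // STEP).astype(np.intp)           # quantize each channel
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]   # single indexing pass

if __name__ == "__main__":
    lut = build_demo_lut()
    img = np.array([[[200, 40, 40], [90, 120, 80]]], dtype=np.uint8)  # 1x2 test image
    print(classify(img, lut))  # [[ True False]] -> only the reddish pixel is flagged
```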